
    Measuring the transition to homogeneity with photometric redshift surveys

    We study the possibility of detecting the transition to homogeneity using photometric redshift catalogs. Our method is based on measuring the fractality of the projected galaxy distribution, using angular distances, and relies only on observable quantities. It thus provides a way to test the Cosmological Principle in a model-independent, unbiased way. We have tested our method on different synthetic inhomogeneous catalogs and shown that it is capable of discriminating some fractal models with relatively large fractal dimensions, in spite of the loss of information due to the radial projection. We have also studied the influence of the redshift bin width, photometric redshift errors, bias, non-linear clustering, and surveyed area on the angular homogeneity index H2(θ) in a ΛCDM cosmology. The level to which an upcoming galaxy survey will be able to constrain the transition to homogeneity will depend mainly on the total surveyed area and the compactness of the surveyed region. In particular, a Dark Energy Survey (DES)-like survey should be able to easily discriminate certain fractal models with fractal dimensions as large as D2 = 2.95. We believe that this method will have relevant applications for upcoming large photometric redshift surveys, such as DES or the Large Synoptic Survey Telescope (LSST).
    Comment: 14 pages, 14 figures
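    The counting idea behind an angular fractality test can be illustrated with a toy estimate of the angular correlation dimension D2: for a homogeneous distribution, the mean number of neighbours within angle θ scales as θ², so a log-log fit of counts against θ should return a slope near 2. This sketch uses synthetic uniform points on a small flat-sky patch, not the paper's H2(θ) estimator or any survey data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "galaxy" positions, uniform on a flat patch of sky
# (flat-sky approximation; angles in patch units).
pts = rng.uniform(0.0, 1.0, size=(4000, 2))

# Use centers away from the patch edges so counting circles stay inside.
centers = pts[(pts[:, 0] > 0.15) & (pts[:, 0] < 0.85)
              & (pts[:, 1] > 0.15) & (pts[:, 1] < 0.85)][:300]

thetas = np.array([0.02, 0.04, 0.06, 0.08, 0.10])
mean_counts = []
for theta in thetas:
    d = np.linalg.norm(pts[None, :, :] - centers[:, None, :], axis=2)
    # Mean number of neighbours within theta, excluding the center itself.
    mean_counts.append(np.mean(np.sum(d < theta, axis=1) - 1))

# Homogeneity implies N(<theta) ~ theta^D2 with D2 = 2; a fractal
# distribution would give a smaller slope.
D2 = np.polyfit(np.log(thetas), np.log(mean_counts), 1)[0]
```

    A fractal catalog with D2 = 2.95 in three dimensions projects to an angular slope only slightly below 2, which is why the abstract stresses that discriminating power depends on surveyed area.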

    Interacting dark sector with variable vacuum energy

    We examine a cosmological scenario where dark matter is coupled to a variable vacuum energy, while baryons and photons are two decoupled components, for a spatially flat Friedmann-Robertson-Walker spacetime. We apply the χ² method to the updated observational Hubble data for constraining the cosmological parameters and analyze the amount of dark energy in the radiation era. We show that our model fulfills the severe bound Ω_x(z ≃ 1100) < 0.009 at the 2σ level, so it is consistent with the recent analysis that includes cosmic microwave background anisotropy measurements from the Planck survey, the Atacama Cosmology Telescope, and the South Pole Telescope, along with the future constraints achievable by the Euclid and CMBPol experiments. It also fulfills the stringent bound Ω_x(z ≃ 10^10) < 0.04 at the 2σ level in the big-bang nucleosynthesis epoch.
    Comment: 5 pages, 3 figures, 2 tables. (http://prd.aps.org/abstract/PRD/v88/i8/e087301)
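    The χ² fit to Hubble data mentioned above can be sketched in a few lines. This is a minimal illustration on synthetic H(z) points with an ordinary flat ΛCDM expansion rate standing in for the paper's interacting-dark-sector model; the data values, error bars, and parameter grids are invented for the example:

```python
import numpy as np

def hubble(z, H0, Om):
    # Flat LambdaCDM expansion rate: H(z) = H0 * sqrt(Om (1+z)^3 + 1 - Om).
    return H0 * np.sqrt(Om * (1 + z) ** 3 + 1 - Om)

# Synthetic "observational Hubble data": true H0 = 70, Om = 0.3, 5% errors.
rng = np.random.default_rng(1)
z = np.linspace(0.1, 2.0, 20)
sigma = 0.05 * hubble(z, 70.0, 0.3)
H_obs = hubble(z, 70.0, 0.3) + rng.normal(0.0, sigma)

# chi^2(H0, Om) over a parameter grid; the best fit minimizes it.
H0_grid = np.linspace(60.0, 80.0, 201)
Om_grid = np.linspace(0.1, 0.5, 201)
chi2 = np.array([[np.sum(((H_obs - hubble(z, H0, Om)) / sigma) ** 2)
                  for Om in Om_grid] for H0 in H0_grid])
i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
H0_fit, Om_fit = H0_grid[i], Om_grid[j]
```

    Confidence regions such as the quoted 2σ bounds follow from the Δχ² contours around this minimum.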

    Osuna versus Lemos: the Panegyricus controversy

    Seventeenth-century Neapolitan printing (the Seicento) produced a remarkable number of publications in Spanish, Latin, and Italian belonging to the vast field of celebratory literature. They read as documents in which the demands of power pervade the literary material: the texts become signs of institution, products of a writing practice that answers to the requirements of established power. The article examines the most salient examples of this kind of literature during the viceroyalty of the seventh Count of Lemos (1610-1616) and compares the abundance of this writing practice in that period with the manifest scarcity during the viceroyalty of the third Duke of Osuna (1616-1620).

    Informal measurement of the p-value through simulation

    There is growing recognition of the importance of developing informal inferential reasoning (IIR) before the concepts are learned formally. Nevertheless, there is still little research on its development in the classroom at the upper-secondary level (ages 15-18). In informal settings, the main difficulty is measuring the p-value of a statistic, because the notion of a sampling distribution does not arise naturally. This study analyzes upper-secondary students' reasoning when measuring the p-value of a statistic using a dynamic application that builds an empirical sampling distribution through computer simulation. We find that most students measure the p-value adequately with the help of the simulation, which represents a significant advance in their IIR.
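    The simulation-based approach the study describes can be sketched directly: simulate many repetitions of the experiment under the null hypothesis, and read the p-value off the empirical sampling distribution as the fraction of simulated statistics at least as extreme as the observed one. The coin-flip scenario and counts below are an invented example, not the study's classroom task:

```python
import numpy as np

rng = np.random.default_rng(7)

# Observed result: 34 heads in 50 flips. Null hypothesis: the coin is fair.
n, observed_heads = 50, 34

# Build an empirical sampling distribution by simulating many experiments
# under the null, as a dynamic applet would.
sims = rng.binomial(n, 0.5, size=100_000)

# One-sided empirical p-value: the share of simulations at least as extreme.
p_value = np.mean(sims >= observed_heads)
```

    With a small p-value like this one, students can conclude informally that 34 heads would be surprising if the coin were fair, without ever invoking a formal test statistic.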

    A Mathematical Model to Study Meningococcal Meningitis

    Abstract: The main goal of this work is to introduce a novel mathematical model to study the spreading of meningococcal meningitis. Specifically, it is a discrete mathematical model based on cellular automata where the population is divided into six classes: susceptible, asymptomatic infected, infected with symptoms, carriers, recovered, and dead. It captures the individual characteristics of people in order to give a prediction of both individual behavior and the whole evolution of the population.
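    A cellular-automaton model of this kind can be sketched as a grid of cells, each in one of the six states, updated synchronously with local infection rules. The transition probabilities, neighbourhood choice (von Neumann, periodic boundaries), and initial cluster below are illustrative assumptions, not the rules of the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# States: 0=susceptible, 1=asymptomatic infected, 2=symptomatic infected,
# 3=carrier, 4=recovered, 5=dead.
S, A, I, C, R, D = range(6)
grid = np.full((50, 50), S, dtype=int)
grid[23:27, 23:27] = A  # small initial cluster of asymptomatic cases

P_INFECT, P_SYMPT, P_CARRIER, P_RECOVER, P_DIE = 0.15, 0.25, 0.05, 0.10, 0.02

def step(grid):
    new = grid.copy()
    infectious = np.isin(grid, (A, I, C)).astype(int)
    # Infectious von Neumann neighbours of every cell (periodic boundaries).
    nbrs = (np.roll(infectious, 1, 0) + np.roll(infectious, -1, 0) +
            np.roll(infectious, 1, 1) + np.roll(infectious, -1, 1))
    # A susceptible cell runs one infection trial per infectious neighbour.
    catch = (grid == S) & (rng.random(grid.shape) < 1 - (1 - P_INFECT) ** nbrs)
    new[catch] = A
    # Asymptomatic cases develop symptoms or settle into the carrier state.
    u = rng.random(grid.shape)
    asym = grid == A
    new[asym & (u < P_SYMPT)] = I
    new[asym & (u >= P_SYMPT) & (u < P_SYMPT + P_CARRIER)] = C
    # Carriers slowly clear the infection; symptomatic cases recover or die.
    new[(grid == C) & (rng.random(grid.shape) < 0.05)] = R
    sick = grid == I
    v = rng.random(grid.shape)
    new[sick & (v < P_DIE)] = D
    new[sick & (v >= P_DIE) & (v < P_DIE + P_RECOVER)] = R
    return new

for _ in range(60):
    grid = step(grid)
counts = {state: int(np.sum(grid == state)) for state in range(6)}
```

    Because each cell carries its own state and history, such a model can report both individual trajectories and aggregate curves, which is the advantage the abstract claims over purely compartmental (ODE) models.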

    A Highly Available Cluster of Web Servers with Increased Storage Capacity

    Proceedings of the Seventeenth Jornadas de Paralelismo, held 18-20 September 2006 at the Universidad de Castilla-La Mancha in Albacete.
    Web server scalability has traditionally been addressed by improving software elements or increasing the hardware resources of the server machine. Another approach has been the use of distributed architectures. In such architectures, the file allocation strategy has usually been either full replication or full distribution. In previous work we have shown that partial replication offers a good balance between storage capacity and reliability: it offers much higher storage capacity, while reliability can be kept at a level equivalent to that of fully replicated solutions. In this paper we present the architectural details of Web cluster solutions adapted to partial replication. We also show that partial replication does not imply a performance penalty over classical fully replicated architectures. For evaluation purposes we have used a simulation model under the OMNeT++ framework, with mean service time as the performance comparison metric.
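    The storage/reliability trade-off of partial replication can be made concrete with a toy allocator: place each file on k of the N servers instead of on all of them. The round-robin placement below is an illustrative scheme, not the allocation strategy of the paper:

```python
from itertools import cycle

def allocate(files, servers, copies=2):
    """Partial replication: place each file on `copies` consecutive servers
    of a round-robin ring, instead of on every server (full replication)."""
    placement = {}
    ring = cycle(range(len(servers)))
    for f in files:
        start = next(ring)
        placement[f] = [servers[(start + i) % len(servers)]
                        for i in range(copies)]
    return placement

files = [f"file{i}" for i in range(8)]
servers = ["s0", "s1", "s2", "s3"]
plan = allocate(files, servers, copies=2)

# Storage cost: 2 copies per file instead of 4 under full replication,
# i.e. half the space, or twice the effective storage capacity.
total_copies = sum(len(v) for v in plan.values())

# Reliability: any single server failure still leaves every file reachable,
# because each file's two copies sit on distinct servers.
survives_s1_failure = all(any(s != "s1" for s in v) for v in plan.values())
```

    With copies = 2 the cluster tolerates any single-server failure while storing twice as much data as a fully replicated cluster of the same size, which is the balance the abstract refers to.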

    Full two-photon downconversion of just a single photon

    We demonstrate, both numerically and analytically, that it is possible to generate two photons from one and only one photon. We characterize the output two-photon field and bring our calculations close to reality by including losses. Our proposal relies on real or artificial three-level atoms with a cyclic transition strongly coupled to a one-dimensional waveguide. We show that close-to-perfect downconversion, with efficiency over 99%, is reachable using state-of-the-art waveguide-QED architectures such as photonic crystals or superconducting circuits. In particular, we sketch an implementation in circuit QED, where the three-level atom is a transmon.

    Finite-element-based software for teaching electromagnetism

    We present a software tool for teaching electromagnetism. It makes it possible to analyze a wide variety of electromagnetic phenomena in the microwave and millimeter-wave bands. Two aspects given particular care during development were portability and dissemination: the tool is available for Linux and Windows operating systems and in several languages. The graphical interface is based on a general-purpose pre/post-processor for computational analysis, while the electromagnetic analysis rests on stand-alone programs, developed for research purposes, that use the Finite Element Method. The flexibility of the software architecture allows new computation kernels to be introduced easily, so the number of electromagnetic problems that can be analyzed grows steadily.
    Peer Reviewed